Tangible Programming in Early Childhood: Revisiting Developmental Assumptions through New Technologies
Abstract
This chapter explores the idea that, when given appropriate tools, young children can actively engage in computer programming and robotics activities in a way that is consistent with developmentally appropriate practice. In particular, this project proposes the use of emerging tangible user interface (TUI) technology to fundamentally re-envision the way in which children program computers. In short, rather than using a mouse or keyboard to write programs to control robots, children instead construct physical computer programs by connecting interlocking, "smart" wooden blocks. In this chapter we will describe our development efforts in terms of both curriculum and technology. We will also share results from a two-year design-based research study conducted in three kindergarten classrooms.

Tangible Programming in Early Childhood: Revisiting Developmental Assumptions through New Technologies

We are surrounded by technology. From pens and pencils to cell phones and digital cameras, technology permeates our existence. Yet, in the early grades, children learn very little about it. For decades, early childhood curriculum has focused on literacy and numeracy, with some attention paid to science, in particular to the natural world—insects, volcanoes, plants, and the Arctic. And, while understanding the natural world is important, developing children's knowledge of the surrounding man-made world is also important (Bers, 2008). This is the realm of technology and engineering, which focus on the development and application of tools, machines, materials, and processes to help solve human problems.

Early childhood education has not ignored this; it is common to see young children using cardboard or recycled materials to build cities and bridges. However, what is unique to our man-made world today is the fusion of electronics with mechanical structures. We go to the bathroom to wash our hands, and the faucets "know" when to start dispensing water. The elevator "knows" when someone's little hands are in between the doors and that it shouldn't close. Our cell phones "know" how to take pictures, send emails, and behave as alarm clocks. Even our cars "know" where we want to go and can take us there without getting lost. We live in a world in which bits and atoms are increasingly integrated (Gershenfeld, 2000); however, we do not teach our young children about this. In their early schooling experiences, we teach children about polar bears and cacti, which are probably more remote from their everyday experience than smart faucets and cellular phones.

There are several reasons for the lack of focus on technologies in early childhood, but two of the most common claims are that young children are not developmentally ready to understand such complex and abstract phenomena, and that there is a lack of technology with age-appropriate interfaces that would allow children to develop their own technologically rich projects. In this chapter we address both of these claims by presenting research that evaluates young children's ability to build, program, and understand their own robotic creations through the use of a novel programming interface designed specifically for young children.

Background

Our work is rooted in notions of developmentally appropriate practice (DAP), a perspective within early childhood education concerned with creating learning environments sensitive to children's social, emotional, physical, and cognitive development.
DAP is a framework produced by the National Association for the Education of Young Children (Copple & Bredekamp, 2009) that outlines practice that promotes young children's optimal learning and development. DAP is based on theories of child development, the strengths and weaknesses of individual children uncovered through authentic assessment, and individual children's cultural backgrounds as defined by community and family (Copple & Bredekamp, 2009).

DAP is built upon the theory of developmental stages introduced by Jean Piaget, which suggests that children enter the concrete operational stage at age six or seven. According to Piaget, at this age a child gains the ability to perform mental operations in his or her head and also to reverse those operations. As a result, a concrete operational child has a more sophisticated understanding of number, can imagine the world from different perspectives, can systematically compare, sort, and classify objects, and can understand notions of time and causality (Richardson, 1998). Based on this developmental model, one might argue that a child's ability to program a computer can be predicted by his or her general developmental level and that, by extension, a pre-operational kindergartener (typically five years old) may be too young to benefit from or understand computer programming. However, since its introduction, various problems and inconsistencies have been identified with Piaget's stage model. For example, studies have shown that when a task and its context are made clear to children, they exhibit logical thought and understanding well before the ages that Piaget suggested as a lower limit (Richardson, 1998).

In the early days of personal computing, there was lively debate over the developmental appropriateness of computer technology use in early elementary classrooms (Clements & Sarama, 2003). Today, however, the pressing question is no longer whether but how we should introduce computer technology in early elementary school (Clements & Sarama, 2002). For example, a 1992 study found that elementary school children exposed to exploratory software showed gains in self-esteem, non-verbal skills, long-term memory, manual dexterity, and structural knowledge. When combined with other non-computer activities, these students also showed improvements in verbal skills, abstraction, problem solving, and conceptual skills (Haugland, 1992). Other studies have demonstrated that computer use can serve as a catalyst for positive social interaction and collaboration (Clements & Sarama, 2002; Wang & Ching, 2003). Of course, the developmental appropriateness of the technology used by young children depends on the context. What software is being used? And how is it integrated with and supported by the broader classroom curriculum? Fortunately, we live in an advantageous time for introducing technology in early childhood. Given the increasing mandate to make early childhood programs more academically challenging, technology can provide a playful bridge to integrate academic demands with personally meaningful projects (Bers, 2008).

Robotics and Computer Programming in Early Childhood Education

"Computer technology" is a broad term that can mean many things, especially in the context of a classroom. We believe that robotics is one type of educational technology that holds special potential for early childhood classrooms, where children engage in cognitive as well as motor and social skills development.
Furthermore, in early childhood, content areas tend not to be isolated but are integrated more broadly into a classroom curriculum that encompasses different content and skills; learning can be project-driven and open-ended; and student work does not have to fit into an hour-long class period. Thus, robotics can be a good integrator of curricular content (Bers, Ponte, Juelich, Viera, & Schenker, 2002). Robotics provides opportunities for young children to learn about sensors, motors, and the digital domain in a playful way by building their own projects, such as cars that follow a light, elevators that work with touch sensors, and puppets that can play music. Young children can become engineers by playing with gears, levers, motors, sensors, and programming loops, as well as storytellers by creating their own meaningful projects that move in response to their environment (Bers, 2008; Wang & Ching, 2003). Robotics can also be a gateway for children to learn about applied mathematical concepts, the scientific method of inquiry, and problem solving (Rogers & Portsmore, 2004). Moreover, robotic manipulatives invite children to participate in social interactions and negotiations while playing to learn and learning to play (Resnick, 2003).

[Footnote 1: As of 2006, thirty-seven states have included engineering/technology standards in their educational frameworks.]

Robotics, however, is about more than just creating physical artifacts. In order to bring robots to "life," children must also create computer programs—digital artifacts that allow robots to move, blink, sing, and respond to their environment. Previous research has shown that children as young as four years old can understand the basic concepts of computer programming and can build and program simple robotics projects (Bers, 2008; Cejka, Rogers, & Portsmore, 2006; Bers, Rogers, Beals, Portsmore, Staszowski, Cejka, Carberry, Gravel, Anderson, & Barnett, 2006). Furthermore, early studies with the text-based language Logo have shown that computer programming, when introduced in a structured way, can help young children with a variety of cognitive skills, including basic number sense, language skills, and visual memory (Clements, 1999).

Nonetheless, computer programming is difficult for novices of any age. Kelleher and Pausch (2005) offer a taxonomy containing well over 50 novice programming systems, a great number of which aim to ease or eliminate the process of learning language syntax, perhaps the most often cited source of novice frustration. Beyond syntax, there are many specific conceptual hurdles faced by novice programmers, as well as fundamental misconceptions about the nature of computers and computer programming. Ben-Ari (1998) points out that unlike the beginning physics student, who at least has a naïve understanding of the physical world, beginning programmers have no effective model of a computer upon which to build new knowledge. Worse, rarely is an effort made to help students develop a working model. According to Norman (1986), the primary problem facing novice programmers is the gap between the representation the brain uses when thinking about a problem and the representation a computer will accept.

In addition to the above challenges faced by novice programmers, we must also consider the developmental needs and capabilities of young children.
McKeithen, Reitman, Rueter, and Hirtle (1981) conducted a study that explored the differences in the ability of expert and novice computer programmers to recall details of computer programs. In their analysis, they theorize that because novice programmers lack adequate mental models for programming tasks, they rely on rich common language associations for these concepts. For example, computer words like LOOP, FOR, STRING, and CASE have very different common language meanings. Reflecting on this result, it seems reasonable to expect that young children will have an especially difficult time building conceptual models for programming concepts because they have fewer mental schemas on which to build. Likewise, Rader, Brand, and Lewis (1997) conducted a study with Apple's KidSim programming system in which second/third graders and fourth/fifth graders used the system for one year with minimal structured instruction. At the end of the year, the younger children had significantly more difficulty with programming concepts such as individual actions, rule order, and subroutines. However, the authors suggest that with structured instruction of programming concepts, the younger children would have developed a much better understanding of the system.

Yet, regardless of the way in which the content is presented, most current programming environments are not well suited for very young children. One problem is that the syntax of text-based computer languages, such as Logo, can be unintuitive and frustrating for novice programmers. This is exacerbated for young children who are still learning how to read. Modern visual programming languages such as ROBOLAB [Footnote 2: http://www.legoengineering.com/] allow children to program by dragging and connecting icons on the computer screen. And, while this approach simplifies language syntax, the interfaces require young children to use a mouse to navigate hierarchical menus, click on icons, and drag lines to very small target areas on a computer screen. All of this requires fine motor skills that make it difficult for young children to participate (Hourcade, Bederson, Druin, & Guimbretière, 2004). As a result, adults often have to sit with young children and give click-by-click instructions to make programming possible, which poses challenges for children's learning (Beals & Bers, 2006). It also makes it difficult to implement computer programming in average schools, where there are often only one or two adults per twenty-five children. Attempts have been made to create simpler versions of these languages. However, the resulting interfaces often obscure some of the most important aspects of programming, such as the notion of creating a sequence of commands to form a program's flow of control. In the next section we present work on tangible computer programming that explores new interfaces to address some of the challenges presented here.

Tangible Computer Programming

With the emergence of tangible user interface (TUI) technology, we have a new means to separate the intellectual act of computer programming from the confounding factor of modern graphical user interfaces. And, with this, we have an opportunity to build a much better understanding of the developmental capabilities of young children with respect to computer programming.
Just as young children can read books that are appropriate for their age level, we propose that young children can write simple but interesting computer programs, provided they have access to a developmentally appropriate programming language. Indeed, in our own extensive previous work with robotics and young children, we observed that when presented with new technologies that make use of well-established sensorimotor skills, young children are able to display complex mental operations. We believe that we can overcome the inherent limitations of modern desktop and laptop computers by doing nothing short of removing them from children's learning experiences. Thus, rather than write computer programs with a keyboard or mouse, we have created a system that allows children to instead construct physical computer programs by connecting interlocking wooden blocks (see Figure 1). This technique is called tangible programming.

Figure 1. A tangible programming language for robotics developed at Tufts University. With this language, children construct programs using interlocking wooden blocks.

A tangible programming language, like any other type of computer language, is simply a tool for telling a computer what to do. With a text-based language, a programmer uses words such as BEGIN, IF, and REPEAT to instruct a computer. This code must be written according to strict, and often frustrating, syntactic rules. With a visual language, words are replaced by pictures, and programs are expressed by arranging and connecting icons on the computer screen with the mouse. There are still syntax rules to follow, but they can be conveyed to the programmer through a set of visual cues. Instead of relying on pictures and words on a computer screen, tangible languages use physical objects to represent the various aspects of computer programming. Users arrange and connect these physical elements to construct programs. Rather than falling back on implied rules and conventions, tangible languages can exploit the physical properties of objects, such as size, shape, and material, to express and enforce syntax. For example, the interlocking wooden blocks shown in Figure 1 embody the language syntax (i.e., a sequential connection of blocks). In fact, with this language, while it is possible to make mistakes in program logic, it is impossible to produce a syntax error.

Our pilot studies, conducted in public schools in the Boston area, have suggested that in moving away from a mouse-based interface, tangible languages might have the added benefit of improving both the style and amount of collaboration occurring between students. And, since the process of constructing programs is now situated in the classroom at large—on children's desks or on the floor—children's programming work can be more open and visible and can become more a part of presentations and discussions of technology projects. Likewise, physical programming elements can be incorporated into whole-class instruction activities without the need for a shared computer display or an LCD projector.

Figure 2. This picture, taken from a pilot study, shows kindergarten students using the tangible programming language developed at Tufts to program a robot to act out a short story.
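As a concrete illustration of the claim above that a chain of interlocking blocks cannot produce a syntax error, the sketch below shows, in Python, one way the digital counterpart of such a chain could be represented and executed. The block names and the Robot class are hypothetical stand-ins rather than the actual Tufts implementation (which drives LEGO robots); the point is that the program's data structure is simply an ordered list, so the only errors a child can make are logical ones.

```python
# A minimal sketch (not the actual Tufts implementation) of how a physical
# chain of blocks can be read as a program.  Block names and the Robot class
# are hypothetical stand-ins; the real system controls a LEGO Mindstorms RCX.

class Robot:
    """Stand-in robot that just reports the actions it would perform."""
    def forward(self):
        print("robot: move forward")
    def backward(self):
        print("robot: move backward")
    def spin(self):
        print("robot: spin")
    def beep(self):
        print("robot: beep")

# Each block type maps to exactly one robot action.
ACTIONS = {
    "FORWARD": Robot.forward,
    "BACKWARD": Robot.backward,
    "SPIN": Robot.spin,
    "BEEP": Robot.beep,
}

def run_program(blocks, robot):
    """Execute a tangible program given as an ordered list of block names.

    Because the blocks only snap together end to end, the physical chain is
    always a valid linear sequence: there is no syntax for the child to get
    wrong, only the logic of which blocks to use and in what order.
    """
    for block in blocks:
        ACTIONS[block](robot)

# Example chain a child might build between the BEGIN and END caps.
run_program(["FORWARD", "FORWARD", "SPIN", "BEEP"], Robot())
```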
The idea of tangible programming was first introduced in the mid-1970s by Radia Perlman, then a researcher at the MIT Logo Lab. Perlman believed that the syntax rules of text-based computer languages represented a serious barrier to learning for young children. To address this issue she developed an interface called Slot Machines (Perlman, 1976) that allowed young children to insert cards representing various Logo commands into three colored racks. This idea of tangible programming was revived nearly two decades later (Suzuki & Kato, 1995), and since then a variety of tangible languages have been created in a number of different research labs around the world (e.g., McNerney, 2004; Wyeth, 2008; Smith, 2007). In almost all cases, the blocks that make up tangible programming languages contain some form of electronic components. And, when connected, the blocks form structures that are more than just abstract representations of algorithms; they are also working, specialized computers that can execute algorithms through the sequential interaction of the blocks. Unfortunately, the blocks that make up these languages tend to be delicate and expensive. As a result, tangible programming languages have seldom been used outside of research lab settings.

At Tufts we have taken a different approach (Horn & Jacob, 2007). Programs created with our language are purely symbolic representations of algorithms—much in the way that Java or C++ programs are only collections of text files. An additional piece of technology must be used to translate the abstract representation of a program into a machine language that will control a robot. This approach allows us to create inexpensive and durable parts, and it provides greater freedom in the design of the physical components of the language. Our current prototype uses a collection of image processing techniques to convert physical programs into digital instructions. For the system to work, each block in the language is imprinted with a circular symbol called a TopCode (Horn, 2008). These codes allow the position, orientation, size, and type of each statement to be quickly determined from a digital image. The image processing routines work under a variety of lighting conditions without the need for human calibration. Our prototype uses a standard consumer web camera connected to a laptop computer. Children initiate a compile by pressing the spacebar on the computer, and their program is downloaded onto a robot in a matter of seconds.
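The compile step just described can be summarized as a short pipeline: detect the TopCode symbols in a camera frame, order them along the physical chain, and translate each code into a language statement before downloading the result to the robot. The sketch below illustrates that pipeline in Python; scan_topcodes() and the code-to-statement table are invented stand-ins for the real TopCode recognizer and block set, and the download step is omitted, so this is an illustration of the idea rather than the actual system.

```python
# Illustrative sketch of the tangible-to-digital compile pipeline.  The
# scan_topcodes() helper and the code-to-statement table are invented
# stand-ins; they are not the actual Tufts implementation.

# Hypothetical mapping from TopCode id numbers to language statements.
CODE_TO_STATEMENT = {
    31: "BEGIN",
    47: "FORWARD",
    61: "SPIN",
    79: "END",
}

def scan_topcodes(image):
    """Stand-in for the TopCode recognizer.  The real scanner reports the id,
    position, and orientation of every symbol found in the image; here we
    return a canned example frame as (code_id, x, y, angle) tuples."""
    return [(47, 220, 140, 0.0), (31, 100, 142, 0.0),
            (79, 460, 139, 0.0), (61, 340, 141, 0.0)]

def compile_program(image):
    """Turn one camera frame into an ordered list of statements."""
    detections = scan_topcodes(image)
    # Blocks are chained left to right on the table, so sorting the detected
    # symbols by x position recovers the order of the physical program.
    detections.sort(key=lambda d: d[1])
    program = [CODE_TO_STATEMENT[code] for code, _x, _y, _angle in detections]
    # The connectors make malformed chains physically impossible, so the only
    # sanity check needed is that the chain is capped by BEGIN and END.
    assert program[0] == "BEGIN" and program[-1] == "END"
    return program

if __name__ == "__main__":
    # In the classroom prototype, pressing the spacebar grabs a frame from
    # the web camera; here we pass a placeholder in its place.
    print(compile_program(image=None))   # -> ['BEGIN', 'FORWARD', 'SPIN', 'END']
```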
Tangible Programming Curriculum for Kindergarten

Research has shown that mere exposure to computer programming in an unstructured way has little demonstrable effect on student learning (Clements, 1999). For example, a 1997 study involving the visual programming language KidSim found that elementary school students failed to grasp many aspects of the language, leading the authors to suggest that more explicit instruction might have improved the situation (Rader et al., 1997). Therefore, an important aspect of our work is to develop curriculum that utilizes tangible programming to introduce a series of powerful ideas from computer science in a structured, age-appropriate way. The term powerful idea refers to a concept that is at once personally useful, interconnected with other disciplines, and rooted in intuitive knowledge that a child has internalized over a long period of time (Papert, 1991). We introduce these powerful ideas in a context in which their use allows very young children to solve compelling problems.

Table 1 lists example powerful ideas from computer programming that we have selected to emphasize in our curriculum using tangible programming. Based on these powerful ideas we developed a preliminary 12-hour curriculum module for use in kindergarten classrooms that introduces a subset of these concepts through a combination of whole-class instruction, structured small-group challenge activities, and open-ended student projects. We piloted this curriculum in three kindergarten classrooms in the Boston area, using the tangible programming blocks described above and LEGO Mindstorms RCX construction kits. Initially, we provided students with pre-assembled robot cars to teach preliminary computer programming concepts. As the unit progressed, students disassembled these cars to build diverse robotic creations that incorporated arts and craft materials, recycled goods such as cardboard tubes and boxes, and a limited number of LEGO parts. Table 2 provides a brief overview of the activities that we included in the curriculum.

Table 1
Powerful ideas from computer programming and robotics emphasized in our curriculum

Computer Programming: This is the fundamental idea that robots are not living things that act of their own accord. Instead, robots act out computer programs written by human beings. Not only that, children can create their own computer programs to control a robot.

Command Sequences & Control Flow: The idea that simple commands can be combined into sequences of actions to be acted out by a robot in order.

Loops: The idea that sequences of instructions can be modified to repeat indefinitely or in a controlled way.

Sensors: The idea that a robot can sense its surrounding environment through a variety of modalities, and that a robot can be programmed to respond to changes in its environment.

Parameters: The idea that some instructions can be qualified with additional